Flexible-arm-oriented multi-modal human-machine interaction control method
Patent abstract:
The present invention relates to the technical fields of human-machine interaction, artificial intelligence and pattern recognition, and in particular to a flexible-arm-oriented multi-modal human-machine interaction control method. The flexible-arm-oriented multi-modal human-machine interaction control method, in which a sensor, an upper machine and a flexible arm are adopted, includes: step 1: acquiring data of an electromyographic signal of a controller, and preprocessing the electromyographic signal; step 2: extracting features from the preprocessed electromyographic signals to train a classifier model; step 3: wearing and calibrating the sensor by the controller; step 4: loading, by the upper machine, the classifier model trained in the step 2, and dynamically analyzing the electromyographic signal and IMU data with a detection algorithm to obtain a joint angle; and step 5: inputting the joint angle into a proportional-differential controller to control the flexible arm.

Publication number: NL2027179A
Application number: NL2027179
Application date: 2020-12-21
Publication date: 2021-04-06
Inventors: Duan Feng; Zheng Haosi; Yokoi Hiroshi
Applicant: Nan Kai Univ
Patent description:
[0001] The present invention relates to the technical fields of human-machine interaction, artificial intelligence and pattern recognition, and in particular to a flexible-arm-oriented multi-modal human-machine interaction control method.

BACKGROUND

[0002] With the continuous improvement of the industrial automation level, industrial mechanical arms are applied more and more extensively; meanwhile, due to rapid changes in social production and lifestyles, traditional industrial mechanical arms are unable to meet the production requirements of high compliance, intelligence and dexterity. Therefore, a new generation of flexible mechanical arms has emerged on the market, and correspondingly, flexible-arm-oriented human-machine interaction interfaces that recognize the motion intention of the upper limb of a human body have also been proposed successively.

[0003] A surface electromyography (sEMG) signal is a non-invasive electrical signal collected from the skin surface of the human body; the superposed electrical signals generated by several motor units reflect the motion intention of the human body well. Information such as the posture and position of a system may be estimated from an inertial measurement unit (IMU) signal providing nine-axis posture data (accelerations, angular velocities and magnetic field intensities in the x, y and z directions) without depending on external reference information. Due to characteristics such as easy acquisition and non-invasiveness of the sEMG signal and the IMU signal, a new human-machine interaction control interface based on a control mode mixing the two signal modalities has gradually developed.
[0004] In recent years, research on analyzing the sEMG signal and the IMU signal to acquire joint angles for controlling a mechanical arm has been carried out by related research organizations, such as a Keehoon research group of the Korea Advanced Institute of Science and Technology.

SUMMARY

[0005] In order to overcome the defects in the prior art, the present invention provides a flexible-arm-oriented multi-modal human-machine interaction control method, which solves the two problems of a large sensor number and a long training time in previous methods.

[0006] In order to achieve the above-mentioned object, the following technical solution is adopted in the present invention.

[0007] A flexible-arm-oriented multi-modal human-machine interaction control method in which a sensor, an upper machine and a flexible arm are adopted includes:

[0008] step 1: acquiring data of an electromyographic signal of a controller, and preprocessing the electromyographic signal;

[0009] step 2: extracting features from the preprocessed electromyographic signals to train a classifier model;

[0010] step 3: wearing and calibrating the sensor by the controller;

[0011] step 4: loading, by the upper machine, the classifier model trained in the step 2, and dynamically analyzing the electromyographic signal and IMU data with a detection algorithm to obtain a joint angle; and

[0012] step 5: inputting the joint angle into a proportional-differential controller to control the flexible arm.

[0013] As further optimization of the present technical solution, the step 1 includes: acquiring data of the electromyographic signal of a gesture of the controller, and measuring the maximum electromyographic signal amplitude.

[0014] As further optimization of the present technical solution, the preprocessing of the electromyographic signal in the step 1 includes: extracting data of an active segment and performing a low-pass filtering operation using a Butterworth filter.
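The low-pass filtering of the active segment described in [0014] can be sketched in Python as below. This is a minimal illustration, not the patented implementation: the 200 Hz sampling rate and 45 Hz cutoff are taken from the detailed description later in this document, while the 4th filter order is an assumption.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def preprocess_semg(x, fs=200.0, cutoff=45.0, order=4):
    """Zero-phase Butterworth low-pass filter for one sEMG channel.

    fs (200 Hz) and cutoff (45 Hz) follow the values stated in the
    detailed description; the 4th-order choice is an assumption.
    """
    # butter expects the cutoff normalized to the Nyquist frequency fs/2
    b, a = butter(order, cutoff / (fs / 2.0), btype="low")
    # filtfilt runs the filter forward and backward, so no phase lag is
    # introduced into the extracted active segment
    return filtfilt(b, a, x)
```

Zero-phase filtering is chosen here only because offline active segments are processed as whole blocks; a causal filter would be needed for strictly real-time use.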
[0015] As further optimization of the present technical solution, the step 2 includes: extracting and normalizing a feature vector of the electromyographic signal, inputting the feature vector into a neural network, and training the model with a Levenberg optimization algorithm.

[0016] As further optimization of the present technical solution, three features of a mean absolute value, zero crossings and slope sign changes are extracted and normalized.

[0017] As further optimization of the present technical solution, the step 4 includes: extracting the active segment of the electromyographic signal with a double-threshold detection algorithm, preprocessing data of the active segment, extracting features of the data, and inputting the features into the classifier model for identification.

[0018] As further optimization of the present technical solution, the IMU data in the step 4 is identified with an IMU calculation model.

[0019] As further optimization of the present technical solution, quaternion data [x, y, z, w] transmitted to the upper machine by an IMU is converted into a roll angle, a pitch angle and a yaw angle according to the following formula:

roll = atan2(2yw − 2xz, 1 − 2y² − 2z²)
pitch = atan2(2xw − 2yz, 1 − 2x² − 2z²)

[0020] yaw = arcsin(2xy + 2zw)    (1)

[0021] and data regularization is performed on the angles according to the following rules:

IMU_n = IMU_n + 360°, if ΔIMU_n < −180° and Δsgn_n ≠ 0
IMU_n = IMU_n − 360°, if ΔIMU_n > 180° and Δsgn_n ≠ 0
IMU_n = IMU_n, otherwise

[0022] (2)

[0023] wherein IMU_n is (roll_n, pitch_n, yaw_n) at the moment n,

[0024] ΔIMU_n = IMU_n − IMU_{n−1}, Δsgn_n = sgn(IMU_n) − sgn(IMU_{n−1})    (3).
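Formulas (1) to (3) can be sketched in Python as follows. The quaternion component order [x, y, z, w] and the degree units follow the description; the regularization is applied per angle and per sample, which is an assumption about how the rule is evaluated in practice.

```python
import math

def sgn(v):
    """Sign function used by formula (3): -1, 0 or +1."""
    return (v > 0) - (v < 0)

def quat_to_euler(x, y, z, w):
    """Formula (1): quaternion [x, y, z, w] to (roll, pitch, yaw) in degrees."""
    roll = math.degrees(math.atan2(2*y*w - 2*x*z, 1 - 2*y*y - 2*z*z))
    pitch = math.degrees(math.atan2(2*x*w - 2*y*z, 1 - 2*x*x - 2*z*z))
    # clamp guards against rounding pushing the asin argument outside [-1, 1]
    yaw = math.degrees(math.asin(max(-1.0, min(1.0, 2*x*y + 2*z*w))))
    return roll, pitch, yaw

def regularize(prev, cur):
    """Formula (2): remove the discontinuity at +/-180 degrees for one angle."""
    delta = cur - prev  # the Delta IMU_n of formula (3)
    if delta < -180.0 and sgn(cur) - sgn(prev) != 0:
        return cur + 360.0
    if delta > 180.0 and sgn(cur) - sgn(prev) != 0:
        return cur - 360.0
    return cur
```

With the identity quaternion [0, 0, 0, 1] all three angles are zero, and an angle stream jumping from 179° to −179° is regularized to 181°, so consecutive samples stay continuous.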
[0025] As further optimization of the present technical solution, the controlling of the flexible arm in the step 5 includes: calculating a joint angular position vector Θ, forming a feedback error Δθ = θ − Θ (4) by subtracting the joint angular position vector from the joint angle θ acquired in the step 4, inputting the feedback error into the proportional-differential controller, and outputting a moment control signal τ = K_p·Δθ + K_d·(dΔθ/dt) + g(Θ) + f(Θ) (5), wherein K_p and K_d represent the proportional and differential parameters of the controller, g(Θ) is a gravity compensation term, and f(Θ) is a friction compensation term.

[0026] Different from the prior art, the above-mentioned technical solution has the following advantages: control of 8 degrees of freedom of the flexible arm (3 degrees of freedom of a shoulder, 2 degrees of freedom of an elbow and 2 degrees of freedom of a wrist) and a tip gripper (1 degree of freedom) is realized with only a small number of sensors, with a simple wearing operation and convenient use; since the model is simple, the training time is short, the calculation efficiency is high, and the response is fast. The flexible-arm-oriented multi-modal human-machine interaction control method is able to be effectively applied to a production environment mainly based on flexible arm operation, and has an extremely broad application prospect.

BRIEF DESCRIPTION OF THE DRAWINGS

[0027] Fig. 1 is a flow chart of a multi-modal human-machine interaction control method;

[0028] Fig. 2 is a schematic diagram of a worn sensor;

[0029] Fig. 3 is a schematic diagram of an MAV double-threshold detection algorithm; and

[0030] Fig.
4 is a schematic diagram of a flexible arm joint.

DETAILED DESCRIPTION

[0031] In order to explain the technical contents, structural features, objects and effects of the technical solutions in detail, the following detailed description is given with reference to the accompanying drawings in combination with the embodiments.

[0032] Referring to Fig. 1, in a flexible-arm-oriented multi-modal human-machine interaction control method according to a preferred embodiment of the present invention, the hardware required in the whole control system includes an electromyographic bracelet, a nine-axis IMU sensor, an upper machine and a flexible arm.

[0033] The control method includes:

[0034] step 1: selecting the 8-channel electromyographic bracelet with a sampling frequency of 200 Hz and the nine-axis IMU sensor, selecting 5 gestures of fist making, opening, inversion, eversion and a splayed hand, acquiring 120 seconds of sEMG data for each gesture with 5 seconds of rest time every 5 seconds of action in the acquisition process, and counting the maximum electromyographic amplitudes in the inversion and eversion processes, denoted E_MVC for each of the two gestures respectively.

[0035] Step 2: extracting the data segment of the action process according to the 5-second action and 5-second rest rule of the acquisition paradigm, and performing a low-pass filtering operation below 45 Hz on the extracted motion data segment using a Butterworth filter.

[0036] Step 3: setting a sliding window with a size of 250 ms and an incremental window with a size of 50 ms, extracting three features of a mean absolute value (MAV), zero crossings (ZC) and slope sign changes (SSC) in the sliding window, and normalizing these features.
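The sliding-window feature extraction of step 3 can be sketched in Python as below. This is a minimal sketch under stated assumptions: the input is a (channels × samples) array, the window is 50 samples (250 ms at 200 Hz) with a 10-sample (50 ms) increment, and the threshold value `eps` is illustrative, since the description only names it ε.

```python
import numpy as np

def window_features(x, eps=0.01):
    """MAV, ZC and SSC for one channel window x (formulas (6)-(8)).

    eps is the small amplitude threshold; its value here is an
    assumption, the description only calls it epsilon.
    """
    mav = np.mean(np.abs(x))  # formula (6)
    zc = 0
    ssc = 0
    for k in range(len(x) - 1):
        # formula (7): adjacent samples differ by >= eps and have opposite signs
        if abs(x[k] - x[k + 1]) >= eps and x[k] * x[k + 1] < 0:
            zc += 1
    for k in range(1, len(x) - 1):
        # formula (8): x[k] is a sufficiently sharp local extremum
        if (x[k] - x[k - 1]) * (x[k] - x[k + 1]) > eps:
            ssc += 1
    return mav, zc, ssc

def sliding_features(channels, win=50, step=10, eps=0.01):
    """Apply window_features over a (n_channels, n_samples) array.

    win=50 and step=10 samples correspond to the 250 ms window and
    50 ms increment at 200 Hz. Returns one feature row per window
    position, with 3 features per channel (24 for 8 channels).
    """
    n_ch, n = channels.shape
    out = []
    for start in range(0, n - win + 1, step):
        feats = []
        for i in range(n_ch):
            feats.extend(window_features(channels[i, start:start + win], eps))
        out.append(feats)
    return np.array(out)
```

For the 8-channel bracelet this yields the 24-dimensional feature vector that paragraph [0043] feeds into the neural network input layer.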
The three features are calculated with the following formulas, wherein i represents the i-th channel, L represents the length of the window, k represents the k-th piece of data in the window, and sgn() represents a sign function:

[0037] MAV_i = (1/L) · Σ_{k=1}^{L} |x_{i,k}|    (6),

[0038] and the MAV value means that the mean value of the absolute values of the sEMG values within the window is calculated, with i = 1, 2, …, 8 and L = 50;

[0039] (|x_{i,k} − x_{i,k+1}| ≥ ε) ∧ (sgn(x_{i,k} · x_{i,k+1}) < 0)    (7)

[0040] and the ZC value means that if the difference between two adjacent sEMG values is larger than or equal to a threshold ε in absolute value and the adjacent sEMG values have opposite signs, the ZC value within the window is incremented by 1;

[0041] (x_{i,k} − x_{i,k−1}) · (x_{i,k} − x_{i,k+1}) > ε    (8),

[0042] and the SSC value means that if three consecutive sEMG values satisfy the above formula, the SSC value within the window is incremented by 1.

[0043] Since the three features are extracted for each channel, the feature vector input into the classifier has 24 dimensions; that is, the input layer of the neural network has 24 nodes, and before input into the neural network, each feature is also required to be normalized with the following formula:

[0044] x_normalize = (x − μ) / σ    (9),

[0045] wherein μ represents the mean value of the feature, and σ represents the standard deviation of the feature.

[0046] Five-fold cross validation is performed on 12 groups of parameters combining learning rates [0.1, 0.01, 0.001, 0.0001] and hidden layer numbers [50, 100, 150] using a grid search method, and the parameters are finally determined as a learning rate of 0.1 and a hidden layer number of 100. The neural network is trained with a Levenberg optimization algorithm and the model is saved; according to experimental statistics, the training time is less than 1 second.

[0047] Step 4: referring to Fig.
2, wearing the electromyographic bracelet at the muscle group of the extensor digitorum (about 5 cm from the elbow joint).

[0048] Step 5: detecting the sEMG and IMU signal data flow, and extracting an active segment of the sEMG signal with the designed MAV double-threshold detection algorithm, which specifically includes: taking the mean MAV value of the 8-channel sEMG in the last 100 ms,

MAV_mean = (1/N) · Σ_{i=1}^{N} (1/L) · Σ_{k=1}^{L} |x_{i,k}|, with N = 8 and L = 20    (10),

and setting a threshold TH.

[0049] Data of the active segment is preprocessed as in the step 2, features are extracted from the data as in the step 3 and input into the classifier with the saved parameters, and a prediction result is output; the relationship between a predicted gesture and the represented instruction is indicated in the following table:

Gesture        Meaning of instruction
Fist making    Tip gripper is closed
Opening        Tip gripper is opened
Inversion      Sixth joint is rotated by a certain angle clockwise
Eversion       Sixth joint is rotated by a certain angle anticlockwise
Splayed hand   Tip gripper and sixth joint return to zero state

[0050] The rotation angle θ of the sixth joint has a linear relationship with the mean absolute value Ē of the sEMG value: θ = (Ē / E_MVC) · θ_max    (11), wherein θ_max is a limiting angle of the sixth joint.

[0051] Quaternion data [x, y, z, w] transmitted to the upper machine by the IMU is converted into a roll angle, a pitch angle and a yaw angle according to the following formula:

roll = atan2(2yw − 2xz, 1 − 2y² − 2z²)
pitch = atan2(2xw − 2yz, 1 − 2x² − 2z²)
yaw = arcsin(2xy + 2zw)    (1)

[0053] Since the angles are calculated with inverse trigonometric functions, in order to avoid discontinuity between −180 degrees and 180 degrees, the angles are also required to be subjected to data regularization according to the following rule:

IMU_n = IMU_n + 360°, if ΔIMU_n < −180° and Δsgn_n ≠ 0
IMU_n = IMU_n − 360°, if ΔIMU_n > 180° and Δsgn_n ≠ 0
IMU_n = IMU_n, otherwise

[0054] (2)
[0055] wherein IMU_n is (roll_n, pitch_n, yaw_n) at the moment n,

[0056] ΔIMU_n = IMU_n − IMU_{n−1}, Δsgn_n = sgn(IMU_n) − sgn(IMU_{n−1})    (3).

[0057] The pitch angle, the yaw angle and the roll angle obtained by data analysis of the nine-axis IMU sensor correspond to the 1st joint angle, the 2nd joint angle and the 3rd joint angle respectively; the pitch angle, the roll angle and the yaw angle obtained by data analysis of the IMU of the electromyographic bracelet correspond to the 4th joint angle, the 5th joint angle and the 7th joint angle respectively. The i-th joint angle is represented by θ_i with i = 1, 2, …, 7, and θ_1 to θ_7 constitute a 7-dimensional joint vector.

[0058] Step 6: acquiring the current rotation angle of the motor by an encoder, calculating a joint angular position vector Θ according to a motor-joint conversion matrix (determined by the mechanical structure of the flexible arm), forming a feedback error Δθ = θ − Θ (4) by subtracting the joint angular position vector from the joint vector acquired in the step 5, inputting the feedback error into the following PD controller, and outputting a final moment control signal:

[0059] τ = K_p·Δθ + K_d·(dΔθ/dt) + g(Θ) + f(Θ)    (5),

wherein K_p and K_d represent the proportional and differential parameters of the controller, g(Θ) is a gravity compensation term, and f(Θ) is a friction compensation term.

[0060] It should be noted that relational terms herein, such as first, second, or the like, are used solely to distinguish one entity or operation from another entity or operation without necessarily requiring or implying any actual such relationship or order between such entities or operations.
Also, the terms “include”, “comprise” or any other variations thereof are intended to cover non-exclusive inclusion, such that a process, method, article, or terminal device including a series of elements includes not only those elements but also other elements not explicitly listed or inherent to such process, method, article, or terminal device. Without further limitation, an element defined by the phrase “including ...” or “comprising ...” does not exclude the presence of additional elements in a process, method, article, or terminal device that includes the element. Furthermore, in the description, “greater than”, “less than”, “more than”, or the like, are understood to not include the present number; the terms “above”, “below”, “within”, or the like, are understood to include the present number.

[0061] Although the embodiments have been described, other variations and modifications of the embodiments may occur to those skilled in the art once they learn of the basic inventive concepts; therefore, the above description covers only the embodiments of the present invention and is not intended to limit the scope of the invention. All equivalent structures or equivalent processes made by using the contents of the specification and the drawings of the present invention, or applied directly or indirectly in other related technical fields, are included in the scope of the present invention.
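The PD control law of step 6 (formulas (4) and (5)) can be sketched in Python as below. This is an illustrative sketch, not the patented controller: the backward-difference derivative, the zero gravity and friction defaults, and all gain values are assumptions, since the real compensation terms g(Θ) and f(Θ) depend on the flexible arm's dynamics and the description does not specify them.

```python
import numpy as np

class PDController:
    """Formula (5): tau = Kp*dtheta + Kd*(d dtheta/dt) + g(Theta) + f(Theta).

    gravity and friction are model-dependent compensation callables;
    the zero defaults here are placeholders, not the real arm model.
    """

    def __init__(self, kp, kd, dt,
                 gravity=lambda q: np.zeros_like(q),
                 friction=lambda q: np.zeros_like(q)):
        self.kp, self.kd, self.dt = kp, kd, dt
        self.gravity, self.friction = gravity, friction
        self.prev_err = None  # previous feedback error, for the derivative

    def step(self, target, current):
        """One control cycle: target joint angles vs. measured angles."""
        target = np.asarray(target, dtype=float)
        current = np.asarray(current, dtype=float)
        err = target - current  # feedback error of formula (4)
        # backward-difference approximation of the error derivative
        if self.prev_err is None:
            derr = np.zeros_like(err)
        else:
            derr = (err - self.prev_err) / self.dt
        self.prev_err = err
        return (self.kp * err + self.kd * derr
                + self.gravity(current) + self.friction(current))
```

Each call to `step` maps the 7-dimensional joint vector of step 5 and the encoder-derived joint angular position vector to one moment (torque) command per joint.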
Claims:
Claims (9)

[1] A flexible-arm-oriented multimodal human-machine interaction control method in which a sensor, an upper machine and a flexible arm are adopted, comprising: step 1: obtaining data of an electromyographic signal of a controller, and preprocessing the electromyographic signal; step 2: extracting features from the preprocessed electromyographic signals to train a classifier model; step 3: wearing and calibrating the sensor by the controller; step 4: loading, by the upper machine, the classifier model trained in step 2, and dynamically analyzing the electromyographic signal and IMU data with a detection algorithm to obtain a joint angle; and step 5: inputting the joint angle into a proportional-differential controller to control the flexible arm.

[2] The flexible-arm-oriented multimodal human-machine interaction control method of claim 1, wherein step 1 includes: obtaining data of an electromyographic signal of a gesture of the controller, and measuring the maximum electromyographic signal amplitude.

[3] The flexible-arm-oriented multimodal human-machine interaction control method of claim 1, wherein preprocessing the electromyographic signal in step 1 comprises: extracting data of an active segment and performing a low-pass filtering operation using a Butterworth filter.

[4] The flexible-arm-oriented multimodal human-machine interaction control method of claim 1, wherein step 2 comprises: extracting and normalizing a feature vector of the electromyographic signal, inputting the feature vector into a neural network, and training the model with a Levenberg optimization algorithm.

[5] The flexible-arm-oriented multimodal human-machine interaction control method of claim 4, wherein three features of a mean absolute value, zero crossings and slope sign changes are extracted and normalized.
[6] The flexible-arm-oriented multimodal human-machine interaction control method of claim 1, wherein step 4 comprises: extracting the active segment of the electromyographic signal with a double-threshold detection algorithm, preprocessing data of the active segment, extracting features from the data, and inputting the features into the classifier model for identification.

[7] The flexible-arm-oriented multimodal human-machine interaction control method of claim 1, wherein the IMU data in step 4 is identified with an IMU computation model.

[8] The flexible-arm-oriented multimodal human-machine interaction control method of claim 7, wherein quaternion data [x, y, z, w] transmitted to the upper machine by the IMU is converted into a roll angle, a pitch angle and a yaw angle according to the following formula:

roll = atan2(2yw − 2xz, 1 − 2y² − 2z²)
pitch = atan2(2xw − 2yz, 1 − 2x² − 2z²)
yaw = arcsin(2xy + 2zw)    (1)

and data regularization is performed on the angles according to the following rules:

IMU_n = IMU_n + 360°, if ΔIMU_n < −180° and Δsgn_n ≠ 0
IMU_n = IMU_n − 360°, if ΔIMU_n > 180° and Δsgn_n ≠ 0
IMU_n = IMU_n, otherwise

wherein IMU_n is (roll_n, pitch_n, yaw_n) at the moment n, ΔIMU_n = IMU_n − IMU_{n−1}, Δsgn_n = sgn(IMU_n) − sgn(IMU_{n−1})    (3).

[9] The flexible-arm-oriented multimodal human-machine interaction control method of claim 1, wherein controlling the flexible arm in step 5 comprises: calculating a joint angular position vector Θ, forming a feedback error Δθ = θ − Θ (4) by subtracting the joint angular position vector from the joint angle acquired in step 4, inputting the feedback error into the proportional-differential controller, and outputting a moment control signal τ = K_p·Δθ + K_d·(dΔθ/dt) + g(Θ) + f(Θ)    (5), wherein K_p and K_d represent the proportional and differential parameters of the controller, g(Θ) is a gravity compensation term, and f(Θ) is a friction compensation term.
Family patents: CN111399640A, published 2020-07-10
Priority: Application No. CN202010148713.8A, filed 2020-03-05, “Multi-mode man-machine interaction control method for flexible arm” (publication CN111399640A)